Comparison of 2D/3D Features and Their Adaptive Score Level Fusion for 3D Face Recognition
Authors
Abstract
3D face data have been introduced in the literature to deal with the unsolved issues of 2D face recognition, namely lighting and pose variations. In this paper, we study and compare the distinctiveness of features extracted from both registered 2D face images and 3D face models. A Sparse Representation Classifier (SRC) is exploited to calculate all similarity measures, which are compared with those of a Nearest Neighbor (NN) baseline. As individual 2D and 3D features are far from distinctive enough to discriminate human faces, we further present an adaptive score level fusion strategy for multimodal 2D-3D face recognition. The novel fusion strategy consists of an offline and an online weight learning process, both of which automatically select the most relevant weights of all the scores for each probe face in each modality. The weights calculated offline are based on the EER value of each type of feature, while the online ones are dynamically obtained according to matching scores. Both types of weights are then fused to generate a final weight. Tested on the complete FRGC v2.0 dataset, the best rank-one recognition rates using only 3D or 2D features are 79.72% and 77.89%, respectively, while the newly proposed adaptive fusion strategy achieves 95.48%, with a 97.03% verification rate at 0.001 FAR, highlighting the benefit of exploiting both 3D and 2D cues as well as the effectiveness of our adaptive fusion strategy.
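The abstract outlines the fusion pipeline (offline EER-based weights, online weights from matching scores, combined into a final per-modality weight) without giving formulas. The sketch below is a minimal, hypothetical illustration of that scheme: the offline weight favors the modality with the lower EER, the online weight uses the margin between the best and second-best match scores for the current probe, and min-max-normalized scores are combined with the product of the two weights. All function names and the specific weighting rules are assumptions, not the paper's actual method.

```python
import numpy as np

def offline_weights(eers):
    # One plausible offline rule: a lower EER means a more reliable
    # modality, so give it a higher (normalized) weight.
    w = 1.0 - np.asarray(eers, dtype=float)
    return w / w.sum()

def online_weight(scores):
    # Hypothetical online rule: confidence from the separation between the
    # best and second-best similarity scores for this probe.
    s = np.sort(np.asarray(scores, dtype=float))[::-1]
    return (s[0] - s[1]) / (s[0] + 1e-12)

def fuse(score_sets, eers):
    # score_sets: one similarity-score vector over the gallery per modality.
    w_off = offline_weights(eers)
    fused = np.zeros(len(score_sets[0]), dtype=float)
    for s, wo in zip(score_sets, w_off):
        s = np.asarray(s, dtype=float)
        s_norm = (s - s.min()) / (s.max() - s.min() + 1e-12)  # min-max normalize
        fused += wo * online_weight(s) * s_norm  # combine offline/online weights
    return fused

# Toy similarity scores of one probe against a 3-subject gallery.
scores_3d = [0.91, 0.40, 0.35]
scores_2d = [0.55, 0.60, 0.30]
fused = fuse([scores_3d, scores_2d], eers=[0.05, 0.12])
print(int(np.argmax(fused)))  # index of the best fused match
```

Here the confident 3D scores dominate the ambiguous 2D ones, so the fused decision follows the 3D ranking, which is the behavior an adaptive per-probe weighting is meant to produce.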
Similar resources
Hybridization of Facial Features and Use of Multi Modal Information for 3D Face Recognition
Despite achieving good performance in controlled environments, conventional 3D face recognition systems still encounter problems in handling large variations in lighting conditions, facial expression and head pose. Humans use a hybrid approach to recognize faces, and therefore in this proposed method the human face recognition ability is incorporated by combining global and local ...
Hand Gesture Recognition from RGB-D Data using 2D and 3D Convolutional Neural Networks: a comparative study
Despite considerable advances in recognizing hand gestures from still images, there are still many challenges in the classification of hand gestures in videos. The latter comes with more challenges, including higher computational complexity and the arduous task of representing temporal features. Hand movement dynamics, represented by temporal features, have to be extracted by analyzing the total fr...
متن کاملA Feature-level Fusion of Appearance and Passive Depth Information for Face Recognition
Face recognition using 2D intensity/colour images has been extensively researched over the past two decades (Zhao et al., 2003). More recently, some in-roads into 3D recognition have been made by others (Bowyer et al., 2006). However, the 2D and 3D face recognition paradigms both have their respective strengths and weaknesses. 2D face recognition methods suffer from variability in pose ...
Learning to Fuse 3D+2D Based Face Recognition at Both Feature and Decision Levels
2D intensity images and 3D shape models are both useful for face recognition, but in different ways. While algorithms have long been developed using 2D or 3D data, recent years have seen work on combining both into multi-modal face biometrics to achieve higher performance. However, the fusion of the two modalities has mostly been at the decision level, based on scores obtained from independent 2D an...
2D&3D-ComFusFace: 2D and 3D Face Recognition by Scalable Fusion of Common Features
In traditional 2D and 3D face recognition systems, different features are extracted from 2D and 3D face images and then fused to improve recognition performance. The shortcoming of these methods is that they neglect the intrinsic complementary features between 2D and 3D data. In this paper, we investigate the possibility of extracting and scalably fusing common features from 2D intensity ...
Journal:
Volume, Issue:
Pages: -
Publication date: 2010